Alexander Huth: Mapping semantic representation in the brain using natural language



Published November 12, 2014


Talk by Alex Huth of the Gallant lab at UC Berkeley, given to the Redwood Center for Theoretical Neuroscience at UC Berkeley.

Abstract
Human beings have the unique ability to extract the meaning, or semantic content, from spoken language. Yet little is known about how the semantic content of everyday narrative speech is represented in the brain. We used a new fMRI-based approach to show that semantic information is represented in complex cortical maps that are highly consistent across subjects. Using BOLD data collected while subjects listened to several hours of natural narrative stories, we constructed voxel-wise semantic regression models that accurately predict BOLD responses based on semantic features extracted from the stories. These semantic features were defined using a statistical word co-occurrence model. We then used a novel Bayesian generative model of cortical maps to discover how the representations revealed by voxel-wise modeling are organized across the cortical sheet. The results of these analyses show that the semantic content of narrative speech is represented across parietal cortex, prefrontal cortex, and temporal cortex in complex maps comprising dozens of semantically selective brain areas.
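The core of the voxel-wise modeling approach described above can be illustrated with a minimal sketch: regress each voxel's BOLD time course on a matrix of semantic stimulus features using ridge regression, then score the fit by the correlation between predicted and observed responses. This is not the lab's actual pipeline; the data here are simulated, and all sizes, the noise level, and the regularization strength are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of voxel-wise semantic regression (simulated data,
# not the authors' code): predict per-voxel BOLD responses from
# semantic stimulus features with closed-form ridge regression.

rng = np.random.default_rng(0)

n_time, n_feat, n_vox = 200, 10, 50              # hypothetical sizes
X = rng.standard_normal((n_time, n_feat))        # semantic features per timepoint
W_true = rng.standard_normal((n_feat, n_vox))    # ground-truth voxel weights
Y = X @ W_true + 0.1 * rng.standard_normal((n_time, n_vox))  # simulated BOLD

def ridge_fit(X, Y, alpha=1.0):
    """Closed-form ridge: W = (X'X + alpha*I)^-1 X'Y, all voxels at once."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(k), X.T @ Y)

W = ridge_fit(X, Y, alpha=1.0)
Y_hat = X @ W

# Model quality per voxel: correlation of predicted vs. observed response.
r = np.array([np.corrcoef(Y[:, v], Y_hat[:, v])[0, 1] for v in range(n_vox)])
print(f"median prediction correlation across voxels: {np.median(r):.2f}")
```

In practice the feature matrix would come from a word co-occurrence model applied to the story transcripts, regularization would be chosen by cross-validation, and prediction accuracy would be evaluated on held-out stories rather than the training data.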


Language: English

SIMILAR ITEMS (based on metadata)
- Community Video, by Sam Vaknin (movies): 292 views, 0 favorites, 0 comments
- Community Texts (texts): 151 views, 2 favorites, 0 comments
- Community Video (movies): 1,050 views, 2 favorites, 0 comments
- Community Video, by Carey G. Butler (movies): 22 views, 0 favorites, 0 comments
- Community Video (movies): 4,976 views, 0 favorites, 0 comments
- TED Talks (movies): 126 views, 0 favorites, 0 comments